Web Survey Bibliography
Because university employees can generally be assumed to be familiar with computers, a web-based survey of all university lecturers should be a good alternative to the traditional paper-and-pencil form, particularly in terms of cost. However, the effect of this survey mode on the quality of the resulting sample, as well as on the quality of the resulting answers, was not yet clear. The following results from an employee survey at the University of Bremen address these questions.
Between November 2002 and February 2003, all university lecturers were invited to take part in a survey on their current job situation and job satisfaction at the university. Assigned at random, half of the respondents received a traditional paper questionnaire, while the other half was asked to complete an online questionnaire. Both questionnaires were kept identical in order to estimate the actual effect of the survey mode. As a consequence, the questionnaire was designed for the paper form but was not optimised for the online one.
The results were analysed in two respects: with regard to the sample and with regard to the content. The overall response rate of almost 47% already revealed mode-specific differences: the paper questionnaire achieved a response rate of about 50%, whereas only 43.4% of the online questionnaires were returned. Both rates are acceptable considering that participation was motivated by nothing more binding than the research interest of a single university employee.
Analysing the two samples by sex and position of the respondents showed that the response rate among women was higher than among men in both survey modes, and that there was a mode effect for position: in the paper mode, professors had a clearly higher response rate than other lecturers, whereas in the online mode this finding was reversed. This effect could also be interpreted as an effect of the respondents' age.
The content analysis was intended to answer three main questions: Does the survey mode affect the response distributions of the questions? Does it affect the extent of item non-response? Are there specific effects on the answers to open-ended questions?
The main result was that both survey modes delivered data of equivalent quality; nevertheless, some peculiarities must be kept in mind.
Homepage - conference (abstract)